Mem. S.A.It. Vol.

Author

  • M. Arnaud
Abstract


Similar articles

Mem. S.A.It. Vol. 00, 0

We present abundances of carbon and oxygen as well as abundance ratios 12C/13C for a sample of carbon stars in the LMC, SMC, Carina, Sculptor and Fornax dwarf galaxies. The overall metallicities in these dwarf galaxies are lower than in the galactic disc. The observations cover most of the AGB and we discuss the abundance patterns in different regions along the AGB. The abundances are determine...


Plane Homaloidal Families of General Degree

Cremona, Mem. Bologna (2), Vol. 5 (1865), pp. 3-35. Noether, Math. Ann., Vol. 3 (1870), p. 166. Ruffini, Mem. Bologna (3), Vol. 8 (1877), pp. 457-525. Bianchi, Giorn. di Mat., Vol. 16 (1878), pp. 263-266. De Jonquieres, Paris C. R., Vol. 101 (1885), pp. 720-724. Montesano, Rendic. Napoli (3), Vol. 11 (1905), p. 269. Larice, Periodico di Mat. (3), Vol. 6 (1909), pp. 234-236. Palatini, Periodico ...


Maximum Entropy Analysis of the Spectral Functions in Lattice QCD

First-principles calculation of the QCD spectral functions (SPFs) based on lattice QCD simulations is reviewed. Special emphasis is placed on Bayesian inference theory and the Maximum Entropy Method (MEM), which is a useful tool for extracting SPFs from the imaginary-time correlation functions obtained numerically by the Monte Carlo method. Three important aspects of MEM are (i) it does not ...
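To make the reconstruction problem concrete, the following is a minimal numpy/scipy sketch of a single fixed-alpha MEM step: it maximizes Q = alpha*S - L, with S the Shannon-Jaynes entropy relative to a default model and L a chi-squared likelihood, for a spectral function discretized on a frequency grid. The zero-temperature kernel exp(-omega*tau), the flat default model, the diagonal (uncorrelated) errors, the mock data, and the fixed value of alpha are simplifying assumptions of this sketch, not details from the review; a full MEM analysis uses the lattice kernel, the complete data covariance, and a Bayesian treatment of alpha.

import numpy as np
from scipy.optimize import minimize

# Frequency grid for the spectral function and imaginary-time slices (illustrative values).
omega = np.linspace(0.01, 6.0, 60)
d_omega = omega[1] - omega[0]
tau = np.arange(1, 16)

# Zero-temperature kernel: C(tau) = int dw exp(-w*tau) * rho(w)  (an assumption of this sketch).
K = np.exp(-np.outer(tau, omega))

# Mock "Monte Carlo" correlator: a single narrow peak plus Gaussian noise.
rho_true = np.exp(-0.5 * ((omega - 2.0) / 0.1) ** 2)
C_data = K @ rho_true * d_omega
sigma = 1e-4 * C_data + 1e-8          # crude diagonal error model
rng = np.random.default_rng(0)
C_data = C_data + rng.normal(0.0, sigma)

m = np.full_like(omega, 0.1)          # flat default model

def neg_Q(rho, alpha):
    """Negative MEM objective -(alpha*S - L) for a fixed alpha."""
    rho = np.maximum(rho, 1e-12)
    # Shannon-Jaynes entropy relative to the default model m.
    S = np.sum(d_omega * (rho - m - rho * np.log(rho / m)))
    # Chi-squared likelihood with diagonal errors.
    C_model = K @ rho * d_omega
    L = 0.5 * np.sum(((C_data - C_model) / sigma) ** 2)
    return -(alpha * S - L)

alpha = 1.0                           # full MEM averages over alpha instead of fixing it
res = minimize(neg_Q, x0=m.copy(), args=(alpha,), method="L-BFGS-B",
               bounds=[(1e-12, None)] * omega.size)
rho_mem = res.x
print("argmax of reconstructed rho at omega =", omega[np.argmax(rho_mem)])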


IEEE Transactions on Information Theory, Vol. XX, No. Y, Month: Error Bounds for Functional Approximation and Estimation Using Mixtures of Experts

We examine some mathematical aspects of learning unknown mappings with the Mixture of Experts Model (MEM). Specifically, we observe that the MEM is at least as powerful as a class of neural networks, in a sense that will be made precise. Upper bounds on the approximation error are established for a wide class of target functions. The general theorem states that inf ‖f − f_n‖_p ≤ c n^(−r/d) holds uniformly for f...
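Spelled out in standard notation, the stated bound reads as below; the symbol glosses (f_n ranging over mixtures of n experts, r the smoothness and d the input dimension of the Sobolev ball W^r_p(L), c a constant independent of n) are my reading of the truncated abstract rather than text quoted from the paper.

\[
  \inf_{f_n \in \mathcal{M}_n} \, \| f - f_n \|_p \;\le\; \frac{c}{n^{r/d}},
  \qquad f \in W^r_p(L),
\]

where \mathcal{M}_n denotes the class of mixture-of-experts models built from n experts.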


IEEE Transactions on Information Theory, May: Error Bounds for Functional Approximation and Estimation Using Mixtures of Experts

We examine some mathematical aspects of learning unknown mappings with the Mixture of Experts Model (MEM). Specifically, we observe that the MEM is at least as powerful as a class of neural networks, in a sense that will be made precise. Upper bounds on the approximation error are established for a wide class of target functions. The general theorem states that ‖f − f_n‖_p ≤ c n^(−r/d) for f ∈ W^r_p(L), a Sobolev class...
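To make the approximant f_n in this bound concrete, here is a minimal numpy sketch of an n-expert mixture-of-experts predictor: a softmax gate assigns input-dependent weights to n experts, and the prediction is the gated sum of the expert outputs. The choice of linear experts and a linear softmax gate, and all parameter shapes, are assumptions of this sketch rather than details taken from the paper.

import numpy as np

rng = np.random.default_rng(1)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

class MixtureOfExperts:
    """f_n(x) = sum_i g_i(x) * e_i(x): softmax gate over n linear experts."""
    def __init__(self, n_experts, dim):
        self.Wg = rng.normal(size=(dim, n_experts))   # gating weights
        self.bg = np.zeros(n_experts)
        self.We = rng.normal(size=(dim, n_experts))   # one linear expert per column
        self.be = np.zeros(n_experts)

    def __call__(self, X):
        gates = softmax(X @ self.Wg + self.bg)        # (N, n) mixture weights, rows sum to 1
        experts = X @ self.We + self.be               # (N, n) expert outputs
        return np.sum(gates * experts, axis=1)        # gated combination, shape (N,)

# Evaluate an (untrained) 8-expert model on random 3-dimensional inputs.
model = MixtureOfExperts(n_experts=8, dim=3)
X = rng.normal(size=(16, 3))
print(model(X).shape)                                 # -> (16,)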



Journal title:

Volume   Issue

Pages  -

Publication date: 2003